Archive for the ‘Evolution’ Category
Very interesting, given my low-carb way of eating. John Upton writes in the Pacific Standard:
In a Western world of plentiful crop cultivation, packeted convenience, and overconsumption, carbohydrates can be a curse. The sugar in today’s doses is often toxic, and plentiful servings of carb-rich wheat, rice, and corn can conspire with it to fuel plagues of obesity and diabetes.
But in worlds inhabited by our distant ancestors, before agriculture and production lines and Cap’n Crunch, carbs delivered charitable bursts of energy that weren’t so easy to find. It’s not difficult to imagine how evolution ensured that mammals came to perceive sugar as delicious. But now science is unlocking secrets about this hardwired allure of carbs that goes beyond their obviously alluring flavor.
Our mouths possess a subtle sense—one that’s unlike taste or smell. It’s linked to signaling pathways that recognize the presence of sugars, even sans flavor, and trigger a cerebral response. The response doesn’t just reward our carb consumption with gratifying deliciousness and dopamine. It green-lights the burning of additional energy and improves our physical performance.
Research published four years ago, for example, showed that even rodents with no sense of sweetness showed a glutton-like preference for sugary foods over similarly caloric proteins. The brains of these taste-stunted rats still released dopamine when sugar was consumed, just as a normal rat’s brain would after chowing down on a cupcake. A body of research dating back to 2008 has shown that merely gargling carbohydrate solution during exercise can boost performance.
To test whether this mysterious carb-detecting sense activated parts of the brain that are independent of those that detect sweetness, New Zealand researchers put healthy volunteers into MRIs. The volunteers were instructed to pinch a device with a particular force whenever a cross was lit up in front of their eyes. During some of the tests, the volunteers swished solutions containing maltodextrin, a relatively flavorless type of sugar, in their mouths. In others, a flavor-matching but sugar-free placebo was used. . .
Continue reading. Perhaps Taubes was wrong about the insulin, but he seems to have been right to be suspicious of carbs.
Fascinating answer at the top.
Fascinating article, and it’s worth noting that global warming will bring more tropical diseases to the US, particularly in the Southeast, where the climate is (so far) humid. Indeed, mosquitoes carrying dengue fever are already here, I believe. And Ed Yong explains well how difficult it is to stop malaria:
The meandering Moei river marks the natural boundary between Thailand and Myanmar. Its muddy waters are at their fullest, but François Nosten still crosses them in just a minute, aboard a narrow, wooden boat. In the dry season, he could wade across. As he steps onto the western riverbank, in Myanmar, he passes no checkpoint and presents no passport.
The air is cool. After months of rain, the surrounding jungle pops with vivid lime and emerald hues. Nosten climbs a set of wooden slats that wind away from the bank, up a muddy slope. His pace, as ever, seems relaxed and out of kilter with his almost permanently grave expression and urgent purpose. Nosten, a rangy Frenchman with tousled brown hair and glasses, is one of the world’s leading experts on malaria. He is here to avert a looming disaster. At the top of the slope, he reaches a small village of simple wooden buildings with tin and thatch roofs. This is Hka Naw Tah, home to around 400 people and a testing ground for Nosten’s bold plan to completely stamp out malaria from this critical corner of the world.
Malaria is the work of the single-celled Plasmodium parasites, and Plasmodium falciparum chief among them. They spread between people through the bites of mosquitoes, invading first the liver, then the red blood cells. The first symptoms are generic and flu-like: fever, headache, sweats and chills, vomiting. At that point, the immune system usually curtails the infection. But if the parasites spread to the kidneys, lungs, and brain, things go downhill quickly. Organs start failing. Infected red blood cells clog the brain’s blood vessels, depriving it of oxygen and leading to seizures, unconsciousness, and death.
When Nosten first arrived in Southeast Asia almost 30 years ago, malaria was the biggest killer in the region. Artemisinin changed everything. Spectacularly fast and effective, the drug arrived on the scene in 1994, when options for treating malaria were running out. Since then, “cases have just gone down, down, down,” says Nosten. “I’ve never seen so few in the rainy season—a few hundred this year compared to tens of thousands before.”
But he has no time for celebration. Artemisinin used to clear P. falciparum in a day; now, it can take several. The parasite has started to become resistant. The wonder drug is failing. It is the latest reprise of a decades-long theme: We attack malaria with a new drug, it mounts an evolutionary riposte.
Back in his office, Nosten pulls up a map showing the current whereabouts of the resistant parasites. Three colored bands highlight the borders between Cambodia and Vietnam, Cambodia and Thailand, and Thailand and Myanmar (Burma). Borders. Bold lines on maps, but invisible in reality. A river that can be crossed in a rickety boat is no barrier to a parasite that rides in the salivary glands of mosquitoes or the red blood cells of humans.
History tells us what happens next. Over the last century, almost every frontline antimalarial drug—chloroquine, sulfadoxine, pyrimethamine—has become obsolete because of defiant parasites that emerged from western Cambodia. From this cradle of resistance, the parasites gradually spread west to Africa, causing the deaths of millions. Malaria already kills around 660,000 people every year, and most of them are African kids. If artemisinin resistance reached that continent, it would be catastrophic, especially since there are no good replacement drugs on the immediate horizon.
Nosten thinks that without radical measures, resistance will spread to India and Bangladesh. Once that happens, it will be too late. Those countries are too big, too populous, too uneven in their health services to even dream about containing the resistant parasites. Once there, they will inevitably spread further. He thinks it will happen in three years, maybe four. “Look at the speed of change on this border. It’s exponential. It’s not going to take 10 or 15 years to reach Bangladesh. It’ll take just a few. We have to do something before it’s too late.”
Hundreds of scientists are developing innovative new ways of dealing with malaria, from potential vaccines to new drugs, genetically modified mosquitoes to lethal fungi. As Nosten sees it, none of these will be ready in time. The only way of stopping artemisinin resistance, he says, is to completely remove malaria from its cradle of resistance. “If you want to eliminate artemisinin resistance, you have to eliminate malaria,” says Nosten. Not control it, not contain it. Eliminate it.
That makes the Moei river more than a border between nations. It’s Stalingrad. It’s Thermopylae. It’s the last chance for halting the creeping obsolescence of our best remaining drug. What happens here will decide the fate of millions.
THE WORLD TRIED TO eliminate malaria 60 years ago. . .
I wonder how those who deny evolution explain the development of resistance to pesticides and antibiotics. From an evolutionary point of view, it’s simple: vulnerable organisms die without reproducing, but a population almost always contains enough genetic diversity that some individuals are less affected. These go on to reproduce, and their offspring are also less affected. Any that are totally unaffected leave many similar progeny, so soon we say “the mosquitoes have become resistant to DDT.” Evolutionarily, it’s easy to understand. But how do non-evolutionists explain it, and the genetic changes in the resistant population?
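The logic above can be sketched as a toy haploid selection model (every number here is illustrative, not a measured value): once spraying lets resistant individuals out-reproduce susceptible ones, even a very rare resistance allele ratchets up toward fixation in a few dozen generations.

```python
# Toy haploid selection model: a rare resistance allele spreads once a
# pesticide makes susceptible individuals far less likely to reproduce.
# All parameter values are invented for illustration.

def allele_frequency(p0, w_resistant, w_susceptible, generations):
    """Track the resistance-allele frequency under constant selection."""
    p = p0
    history = [p]
    for _ in range(generations):
        # Each generation, alleles are weighted by the fitness of their carriers.
        mean_fitness = p * w_resistant + (1 - p) * w_susceptible
        p = p * w_resistant / mean_fitness
        history.append(p)
    return history

# Start with resistance in 0.1% of the population; under spraying,
# resistant mosquitoes leave twice as many offspring as susceptible ones.
trajectory = allele_frequency(p0=0.001, w_resistant=1.0,
                              w_susceptible=0.5, generations=30)
print(f"gen 0: {trajectory[0]:.4f}  gen 30: {trajectory[-1]:.4f}")
```

Nothing about the individuals changes; only the mix of the population does, which is exactly why “the mosquitoes have become resistant” is shorthand for a shift in allele frequencies.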
Things always seem to be more complex than expected as one delves more deeply into them. Take genes, for example: Michael White writes in Pacific Standard:
Today, DNA is central to modern biology, but scarcely a century ago biologists were debating whether or not genes actually existed. In his 1909 textbook on heredity, Danish botanist Wilhelm Johannsen coined the term gene to refer to that hereditary “something” that influences the traits of an organism, but without making a commitment to any hypothesis about what that “something” was. Just over a decade later, a prominent biologist could still note that some people viewed genes as “a convenient fiction or algebraic symbolism.”
As the century progressed, biologists came to see genes as real physical objects. They discovered that genes have a definite size, that they are linearly arrayed on chromosomes, that individual genes are responsible for specific chemical events in the cell, and that they are made of DNA and written in the language of the Genetic Code. By the time the Human Genome Project was initiated in 1988, researchers knew that a gene was a segment of DNA with a clear beginning and end and that it acted by directing the production of a particular enzyme or other molecule that did a specific job in the cell. As real things, genes are countable, and in 1999 biologists estimated that humans had “80,000 or so” of them.
Yet, when the dust from the Human Genome Project cleared, we didn’t have nearly so many genes as we thought. By the latest count, we have 20,805 conventional genes that encode enzymes and other proteins. Our inflated gene count, though, wasn’t the only casualty of the Human Genome Project. The very idea of a gene as a well-defined segment of DNA with a clear functional role has also taken a hit, and as a result, our understanding of our relationship with our genes is changing.
One major challenge to the concept of a gene is the growing evidence that many genes are shapeshifters. Instead of a well-defined segment of DNA that encodes a single protein with a clear function, we should view a gene as “a polyfunctional entity that assumes different forms under different cellular states,” according to University of Washington biologist John Stamatoyannopoulos. While researchers have long known that genes are made up of discrete subunits called “exons,” they hadn’t realized until recently the degree to which exons are assembled—like Legos—into sometimes thousands of different combinations. With new technologies, biologists are cataloging these various combinations, but in most cases they don’t know whether those combinations all serve the same function, different functions, or no function at all.
Our concept of a gene is also challenged by the fact that . . .
Pretty cool how new genes can arise: a start sequence is inserted into the middle of some junk DNA, and presto! a new gene—which may be toxic, or neutral, or even helpful. Carl Zimmer has a useful explanatory article in the NY Times.
Brandon Keim reports at Wired:
One of agricultural biotechnology’s great success stories may become a cautionary tale of how short-sighted mismanagement can squander the benefits of genetic modification.
After years of predicting it would happen — and after years of having their suggestions largely ignored by companies, farmers and regulators — scientists have documented the rapid evolution of corn rootworms that are resistant to Bt corn.
Until Bt corn was genetically altered to be poisonous to the pests, rootworms used to cause billions of dollars in damage to U.S. crops. Named for the pesticidal toxin-producing Bacillus thuringiensis gene it contains, Bt corn now accounts for three-quarters of the U.S. corn crop. The vulnerability of this corn could be disastrous for farmers and the environment.
“Unless management practices change, it’s only going to get worse,” said Aaron Gassmann, an Iowa State University entomologist and co-author of a March 17 Proceedings of the National Academy of Sciences study describing rootworm resistance. “There needs to be a fundamental change in how the technology is used.”
First planted in 1996, Bt corn quickly became hugely popular among U.S. farmers. Within a few years, populations of rootworms and corn borers, another common corn pest, had plummeted across the midwest. Yields rose and farmers reduced their use of conventional insecticides that cause more ecological damage than the Bt toxin.
By the turn of the millennium, however, scientists who study the evolution of insecticide resistance were warning of imminent problems. Any rootworm that could survive Bt exposures would have a wide-open field in which to reproduce; unless the crop was carefully managed, resistance would quickly emerge.
Key to effective management, said the scientists, were refuges set aside and planted with non-Bt corn. Within these fields, rootworms would remain susceptible to the Bt toxin. By mating with any Bt-resistant worms that chanced to evolve in neighboring fields, they’d prevent resistance from building up in the gene pool.
But the scientists’ own recommendations — an advisory panel convened in 2002 by the EPA suggested that a full 50 percent of each corn farmer’s fields be devoted to these non-Bt refuges — were resisted by seed companies and eventually the EPA itself, which set voluntary refuge guidelines at between 5 and 20 percent. Many farmers didn’t even follow those recommendations. . .
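The refuge logic the scientists describe can be sketched as a toy two-patch model (all parameters invented for illustration): treat resistance as recessive, let Bt kill every non-resistant genotype outside the refuge, and let survivors from both patches mate at random. Shrinking the refuge from 50 percent to 5 percent dramatically speeds the rise of the resistance allele.

```python
# Toy model of the Bt refuge strategy. Resistance allele r is recessive:
# only rr individuals survive in Bt fields, everyone survives in the refuge.
# Survivors mate at random across both patches. Illustrative numbers only.

def next_freq(q, refuge_fraction):
    """One generation of selection: return the new resistance-allele frequency."""
    f = refuge_fraction
    survivors = f + (1 - f) * q * q          # surviving share of the population
    r_alleles = f * q + (1 - f) * q * q      # resistance alleles among survivors
    return r_alleles / survivors

def generations_to(threshold, q0, refuge_fraction, cap=10_000):
    """Count generations until the resistance allele passes `threshold`."""
    q, gens = q0, 0
    while q < threshold and gens < cap:
        q = next_freq(q, refuge_fraction)
        gens += 1
    return gens

fast = generations_to(0.5, q0=0.01, refuge_fraction=0.05)  # small refuge
slow = generations_to(0.5, q0=0.01, refuge_fraction=0.50)  # large refuge
print(f"5% refuge: {fast} generations; 50% refuge: {slow} generations")
```

The big refuge works because abundant susceptible mates dilute the rare resistant survivors each generation, keeping resistance locked up in heterozygotes that Bt can still kill.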
UPDATE: It occurs to me that a great amount of corn is grown across the Bible Belt, where people often reject evolution in favor of Intelligent Design, with God taking the role of species-maker. What are those people going to make of this new variety of Bt-resistant rootworm? So far as I can see, they can view the evidence from one of two perspectives: evolution has naturally occurred, with the Bt-resistance mutation conferring a significant survival advantage; or God has stepped in to make rootworms Bt-resistant, presumably in furtherance of a divine plan. Either interpretation fits the observations.
When you disagree with someone about evolution, global warming, vaccines, or the like, you’re likely to encounter a way of thinking so foreign that I only just now figured out what might be going on. What I have experienced in such arguments has convinced me that some people view a strong belief as being, in itself, evidence that the belief is true (presumably because “if it wasn’t true, I wouldn’t believe it so strongly—duh!”). In other words, belief is treated as though it were evidence, and the intensity of the belief measures the strength of that evidence: intense belief equals strong evidence, all by itself.
When you try to argue against such a belief, you probably use what we normally think of as evidence, namely facts. Then you run into another problem: the person who views beliefs as evidence for themselves also views facts as opinions. When you point out a fact that contradicts their belief (for which they have, in their sense, loads of evidence: they believe it strongly), a common response is, “That’s (just) your opinion.” Just as they weigh beliefs as we normally weigh evidence, they weigh actual evidence—that is, verifiable facts—as we normally weigh opinions: an opinion being something that’s perfectly fine for you to accept, but that has nothing whatsoever to do with whether I accept it. Just as someone can hold an opinion without affecting my own opinion on the same matter, the facts you present (viewed as merely your opinion) don’t really affect what the other person believes. Daniel Patrick Moynihan warned that, while you are entitled to your own opinions, you are not entitled to your own facts, and that was not an empty warning: some people, as their reasoning shows, really do treat facts as opinions.
That does seem to describe what happens and shows why the arguments go nowhere: the rational person has been offering something that simply has no weight for the believer—the rational one thinks he’s offering evidence, but the believer views him as offering opinion, and of course his opinion is beside the point: “I have my own opinions.”
So: the question becomes, what does have weight for the believer and thus triggers a change in view? It may be couching ideas in terms the believer already accepts: e.g., “I say to you in the name of Jesus our Lord and Savior, send a donation now.” The demand for money is accepted because of the accompanying incantations from the belief system: the system passwords, in effect. And as we’ve seen from a long string of huckstering ministers, those incantations actually work: when the ministers demand money, they tie in salvation, and so it sounds like a pretty good deal: something real and of paramount importance (salvation) for mere money. I recall that Oral Roberts once advised his radio audience that God was going to take him if his listeners didn’t contribute $44 million before some date. (I believe this may have been for Oral Roberts University.) The listeners came through (or at least the Rev. Roberts said that they did, and it’s certainly true that he did not die at the time, which sort of proves it). The response seems a little odd given that the penalty—God taking Oral Roberts into His Kingdom and Arms—actually sounds like exactly what Roberts claims to want and has been working toward.
At any rate, perhaps we must cast our case for evolution, global warming, and vaccines in theological terms—invoking the name of our Savior liberally, but also sticking with the facts: rational Christianity, in effect. And isn’t that exactly what the Moral Mondays in North Carolina are all about? Aren’t they an effort to get people to look at recent public policy and legislation and view the effects in religious terms? This seems natural enough: it’s what Jesus Himself did when, in His time, He faced circumstances in some ways similar to those in the US today: He helped and cared for the poor and humble—and, you will recall, He condemned wealth harshly. In effect, He was head of the Occupy Jerusalem Movement. And He suffered for it, as is often the case for those who try to help the poor and humble and protect them from the wealthy and powerful.
So it’s been done before. That indicates it might work.
Lamarck rides again: the inheritability of acquired traits—epigenetics—continues to be a field of study. Michael White has a good article on the topic in Pacific Standard:
You inherit some of your grandmother’s genes, but do you also inherit her experiences? Last month, a group of Swedish researchers at the Karolinska Institute and the University of Umeå released their latest in a series of reports on a long-term health study of people born between 1890 and 1920 in Överkalix, a small town in northern Sweden. These scientists have been making waves for more than a decade with claims that our health is influenced by the experiences of our grandparents. Using statistics of 19th-century harvests in Northern Sweden to determine how much food was available to the ancestors of the residents of Överkalix, the researchers concluded that the risk for cardiovascular disease among their study participants was influenced by the dramatic swings from feast to famine experienced by the participants’ grandparents during childhood.
Studies like this one are part of the hodgepodge of research that gets lumped together in the growing and increasingly ill-defined field of epigenetics. Scientific interest in this field has boomed over the past decade, thanks in part to technological advances that make new types of experiments possible. Epigenetics is hot in the popular press as well. It made the cover of Time in 2010. In Germany, Der Spiegel declared that epigenetics is a “victory over the gene,” illustrating both the victory itself and the sexiness of the science with an image of a naked woman emerging from the confines of her gene pool. And of course, epigenetics is touted as the new secret to curing cancer. But the popularity of epigenetics is misplaced: It’s a badly over-hyped field whose recent findings aren’t nearly as revolutionary as many of its practitioners believe.
What is epigenetics? The term was coined in 1942 by British biologist Conrad Waddington, as a name for the study of how genes produce a fully developed organism from a single fertilized egg. More recently, epigenetics is commonly defined as the study of processes that transcend our genes—processes that produce different outcomes from a single, fixed set of genes by controlling how those genes get used. A classic example is a strain of laboratory mice that carry a particular version of a gene that can cause two different coat colors. Two mice may carry identical versions of this gene but nevertheless appear very different, with either light or dark fur, depending on its epigenetic state—whether the gene is in an on or off state.
What is surprising about this gene is that its epigenetic state can be passed from parents to offspring. This is not supposed to happen. Biological development depends on a complex process of switching the epigenetic states of tens of thousands of genes, as a newly fertilized egg gradually transforms itself into an adult organism made up of trillions of cells that come in hundreds of different types. In order for all of this epigenetic switching to happen correctly, genes must reset to a default state at conception. But researchers are finding evidence of exceptions, genes that somehow escape the reset. If states of genes can be inherited, and not just the genes themselves, then life experiences that happen to alter the epigenetic state of your grandparents’ genes can be passed on to you.
Recently developed technologies make it possible for scientists to measure the epigenetic states of genes more comprehensively than ever before. The result is a wave of studies showing how our life experiences impact the state of our genes. These studies are fascinating, but the results shouldn’t be particularly surprising to anyone familiar with the last 60 years of molecular biology. DNA is not a static biological blueprint. Much as your body adapts to a high-altitude environment by ramping up the production of red blood cells, our genomes respond to environmental signals by changing which genes get expressed. This is not news. A trio of French scientists won a Nobel Prize in 1965 for showing how this process works in bacteria, and there have been decades of studies of this phenomenon in humans.
So we shouldn’t be surprised that our physical and even our mental environment can influence the behavior of our genes in many different cells in our body. Epigenetic states of genes are manifestations of DNA doing its job, and many scientists are currently focused on trying to understand what those different epigenetic states mean. This research is important and interesting, but not paradigm-breaking.
The question of inherited epigenetic states is another matter. . .
There’s a lot more to viruses than we thought. Didier Raoult reports in The Scientist:
The theory of evolution was first proposed based on visual observations of animals and plants. Then, in the latter half of the 19th century, the invention of the modern optical microscope helped scientists begin to systematically explore the vast world of previously invisible organisms, dubbed “microbes” by the late, great Louis Pasteur, and led to a rethinking of the classification of living things.
In the mid-1970s, based on the analysis of the ribosomal genes of these organisms, Carl Woese and others proposed a classification that divided living organisms into three domains: eukaryotes, bacteria, and archaea. (See “Discovering Archaea, 1977,” The Scientist, March 2014.) Even though viruses were by that time visible using electron microscopes, they were left off the tree of life because they did not possess the ribosomal genes typically used in phylogenetic analyses. And viruses are still largely considered to be nonliving biomolecules—a characterization spurred, in part, by the work of 1946 Nobel laureate Wendell Meredith Stanley, who in 1935 succeeded in crystallizing the tobacco mosaic virus. Even after crystallization, the virus maintained its biological properties, such as its ability to infect cells, suggesting to Stanley that the virus could not be truly alive.
Recently, however, the discovery of numerous giant virus species—with dimensions and genome sizes that rival those of many microbes—has challenged these views. (See illustration.) In 2003, my colleagues and I announced the discovery of Mimivirus, a parasite of amoebae that researchers had for years considered a bacterium.1 With a diameter of 0.4 micrometers (μm) and a 1.2-megabase-pair DNA genome, the virus defied the predominant notion that viruses could never exceed 0.2 μm. Since then, a number of other startlingly large viruses have been discovered, most recently two Pandoraviruses in July 2013, also inside amoebas. Those viruses harbor genomes of 1.9 million and 2.5 million bases, and for more than 15 years had been considered parasitic eukaryotes that infected amoebas.2
Now, with the advent of whole-genome sequencing, researchers are beginning to realize that most organisms are in fact chimeras containing genes from many different sources—eukaryotic, prokaryotic, and viral alike—leading us to rethink evolution, especially the extent of gene flow between the visible and microscopic worlds. Genomic analysis has, for example, suggested that eukaryotes are the result of ancient interactions between bacteria and archaea. In this context, viruses are becoming more widely recognized as shuttles of genetic material, with metagenomic studies suggesting that the billions of viruses on Earth harbor more genetic information than the rest of the living world combined. (See “Going Viral,” The Scientist, September 2013.) These studies point to viruses being at least as critical in the evolution of life as all the other organisms on Earth.
A giant discovery
Despite the fact that viruses use the same genetic code as verifiably living things, science long classified them as mere collections of biomolecules. And because scientists assumed that viruses had both an upper size limit of just 0.2 μm and a parasitic nature, they classified them in a not-quite-biological world of their own.
That thinking started to change in the early 2000s, when my colleagues and I identified an unknown virus living inside an amoeba. It was as big as some bacteria and archaea and was visible under an optical microscope—qualifying it as a microbe under Pasteur’s original definition. I named it Mimivirus as a personal joke about the stories that my father, a biomedical scientist, told me when I was a child to explain evolution; the stories were based on the life of “Mimi the amoeba.” I initially disguised the true source of this name, however, pretending that Mimivirus came from “MiMicking microbe.”
Researchers had first noticed Mimivirus in 1992, but based on its appearance under light microscopy it had been considered an intracellular bacterium for several years. Transmission electron microscopy images depicting its ultrastructure, along with the determination of its genome sequence in 2004,3 however, confirmed that it was, in fact, part of the viral world. Mimivirus has no ribosomal genes, but its genome contains more than 1,200 genes—three times more than any virus known at the time. Its genome is larger than that of many bacteria and archaea and comparable to some eukaryotic genomes. Mimivirus was no ordinary virus.
Unlike most other viruses, Mimivirus carries genes that encode . . .
Fascinating article at Pacific Standard by Ethan Watters:
One morning last fall, the evolutionary biologist Randy Thornhill was standing with me in front of the gorilla enclosure at the Albuquerque zoo. He was explaining a new theory about the origins of human culture when Mashudu, a 10-year-old western lowland gorilla, decided to help illustrate a point. In a very deliberate way, Mashudu sauntered over to the deep cement ravine at the front of his enclosure, perched his rear end over the edge, and did his morning business.
Mashudu, I suspected, had just displayed what evolutionary theorists call a “behavioral immune response”—a concept central to Thornhill’s big theory. So I asked him whether I was right about Mashudu. “Pooping downhill is pretty smart,” Thornhill said after some consideration. “He got his waste as far away from him as possible. I think that would probably count as a disease avoidance behavior.”
It might seem strange to fixate on how a gorilla goes about answering the call of nature. But according to Thornhill’s hypothesis, much of what we humans like to think of as politics, morality, and culture is motivated by the same kind of subconscious instinct that likely drove Mashudu to that ledge.
Anyone with a basic grasp of biology knows that all animals have immune systems that battle pathogens—be they viruses, bacteria, parasites, or fungi—on the cellular level. And it’s also fairly well understood that animals sometimes exhibit outward behaviors that serve to ward off disease. Just around the corner from the fastidious Mashudu, Thornhill and I watched an orangutan named Sarah grooming her six-month-old son Pixel, poring through his hair for parasites. Some species of primate, Thornhill told me, will ostracize sick members of the group to avoid the spread of disease. Cows and other ungulates are known to rotate their movements among pastures in such a way as to avoid the larvae of intestinal worms that hatch in their waste. And in ant societies, only a small number of workers are given the task of hauling away the dead, while sick ants will sometimes leave the nest to die apart from the group.
At the most quotidian level, Thornhill finds it easy to convince people that humans likewise manifest such instinctual behaviors to avoid infection and illness. Some of these habits very much parallel those seen in other creatures. I admitted to Thornhill that I had recently been displaying a bit of grooming behavior myself after the youngest primate in my care came home from preschool itching with head lice. Like Mashudu, we humans remove waste from our living quarters. We ostracize our sick, at least to the extent that we expect those with the flu to stay home from work or school. And similar to the lowly ant, we assign a small number of our fellows the solemn duty of hauling away and disposing of our dead. On examination, everyday life is full of small defensive moves against contamination, some motivated by feelings, like disgust, that arise without conscious reflection. When you open the door of a gas station bathroom only to decide you can hold it for a few more miles, or when you put as much distance as possible between yourself and a person who is coughing and sneezing in a waiting room, you are displaying a behavioral immune response.
But these individual actions are just the tip of the iceberg, according to Thornhill and a growing camp of evolutionary theorists. Our moment-to-moment psychological reactions to the threat of illness, they suggest, have a huge cumulative effect on culture. Not only that—and here’s where Thornhill’s theory really starts to fire the imagination—these deep interactions between local pathogens and human social evolution may explain many of the basic differences we observe between cultures. How does your culture behave toward strangers? What kind of government do you live under? Who are your sexual partners? What values do you share? All of these questions may mask a more fundamental one: What germs are you warding off?
The threat of disease is not uniform around the world. In general, higher, colder, and drier regions have fewer infectious diseases than warmer, wetter climates. To survive, people in this latter sort of terrain must withstand a higher degree of “pathogen stress.” Thornhill and his colleagues theorize that, over time, the pathogen stress endemic to a place tends to steer a culture in distinct ways. Research has long shown that people in tropical climates with high pathogen loads, for example, are more likely to develop a taste for spicy food, because certain compounds in these foods have antimicrobial properties. They are also prone to value physical attractiveness—a signal of health and “immunocompetence,” according to evolutionary theorists—more highly in mates than people living in cooler latitudes do. But the implications don’t stop there. According to the “pathogen stress theory of values,” the evolutionary case that Thornhill and his colleagues have put forward, our behavioral immune systems—our group responses to local disease threats—play a decisive role in shaping our various political systems, religions, and shared moral views.
If they are right, Thornhill and his colleagues may be on their way to unlocking some of the most stubborn mysteries of human behavior. Their theory may help explain why authoritarian governments tend to persist in certain latitudes while democracies rise in others; why some cultures are xenophobic and others are relatively open to strangers; why certain peoples value equality and individuality while others prize hierarchical structures and strict adherence to tradition. What’s more, their work may offer a clear insight into how societies change. According to Thornhill’s findings, striking at the root of infectious disease threats is by far the most effective form of social engineering available to any would-be reformer.
If you were looking for a paradigm-shifting theory about human behavior, . . .
Interesting note by Kerry Grens in The Scientist on a long-running experiment in evolution:
In 1988, when evolutionary biologist Richard Lenski was an assistant professor at the University of California, Irvine, he started a simple experiment: toss E. coli into a new environment and watch what happens. He wanted to know how reproducible evolution would be, so he put the same strain of the bacteria into 12 flasks with the same simple medium and waited to see how they would evolve. E. coli normally lives in the guts of animals, so the experiment offered a way to observe adaptation to an entirely new environment.
After about a year and 2,000 E. coli generations, Lenski and his colleagues published the first results of what they then considered to be a long-term experiment in evolution. Little did they know that 25 years and 50,000 generations later, the experiment would still be chugging along—those 12 flasks representing alternate universes of bacterial existence. “I guess I didn’t view it as [being as] open-ended as it clearly has become, not only as an experiment but in terms of the ability of the organisms to keep improving,” says Lenski.
In his latest publication on the experiment, Lenski reported that the bacteria continually become more fit. His team pitted bacteria from various evolutionary time points (from each flask, a sample is frozen every 500 generations) against one another to see which would grow better when combined in the same container. “I like to think of this project as time travel because we’re comparing organisms that lived at different points in the past, resurrecting them, and comparing them head to head,” says Lenski.
Lenski and his collaborators can distinguish the competitor populations from different flasks because of color-coded genetic markers. For instance, they would pit a sample taken from one flask of red bacteria at 50,000 generations against an ancestral sample from another flask housing white bacteria. To make sure the resurrected bacteria weren’t at a disadvantage, they would give the organisms time to acclimate after being thawed. Lenski’s team found that bacteria that had evolved for a greater length of time—those from later generations—appeared more fit than those resurrected from earlier generations; fitness never peaked (Science, 342:1364-67, 2013). Their data suggest that, at least in this situation, evolutionary fitness is ever increasing.
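The head-to-head competition described above is commonly summarized as the ratio of the two competitors’ realized (logarithmic) growth rates during co-culture. A minimal sketch in Python—the colony counts below are invented for illustration, not data from the experiment:

```python
import math

def relative_fitness(evolved_initial, evolved_final,
                     ancestor_initial, ancestor_final):
    """Relative fitness of the evolved strain: the ratio of the two
    strains' realized log growth rates while competing in one flask.
    A value above 1 means the evolved strain out-grew its ancestor."""
    return (math.log(evolved_final / evolved_initial) /
            math.log(ancestor_final / ancestor_initial))

# Hypothetical cell densities (cells/mL) before and after one day of
# co-culture; both strains start at the same density.
w = relative_fitness(5e5, 8e7, 5e5, 2e7)
print(round(w, 2))
```

Under these made-up numbers the evolved strain comes out roughly 38 percent fitter; repeating this for samples frozen at many generations is what produces the ever-rising fitness curve in the paper.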
Rees Kassen of the University of Ottawa says the most interesting finding is that most adaptations happened early on in the experiment. “That means [initially] there are lots of opportunities for [the bacteria] to get better,” he says. “Even though beneficial mutations are still very rare events, there are still different ways they could get better, and they also are likely to improve fitness by a large amount.”
The results came as a surprise to Lenski, who expected fitness to plateau. It’s not the first time his bacterial cells have proven unpredictable, such as when they began to utilize a new food source. In 2008, one of the strains evolved to metabolize citrate, which is ordinarily just a buffer in the medium. “It was a quantum leap in the evolution of this species, and it was totally unexpected,” says Tadeusz Kawecki of the University of Lausanne.
John Thompson, an evolutionary biologist at the University of California, Santa Cruz, says that the results show there are many adaptive solutions, even in a simple environment. “It is, then, no wonder that life has evolved to be so diverse,” Thompson writes in an e-mail. “That does not mean, though, that all populations in nature will always continually evolve increases in adaptation.” In cases where the environment is changing rapidly, for instance, slow increases in fitness will not be able to continue.
The finding contradicts the “naive” view that an organism will cease getting fitter once it’s well adapted to an environment, says Kassen. Without Lenski’s experiment, there wouldn’t be much empirical data to show that. “The fact of the matter is, it’s the only experiment we can test,” he says. “No other experiments have gone on as long.” . . .
Evolved systems, whether natural or electronic, tend to exploit every aspect of the environment and the entire range of possibilities: a blind watchmaker is not limited by what he sees, after all. He’ll try everything, whether it makes sense or not.
Consider the evolved system described in this paper: a circuit that was evolved to distinguish between two signals—“Using only 100 logic cells, evolution had to come up with a circuit that could discriminate between two tones, one at 1 kilohertz and the other at 10 kilohertz.” The experiment succeeded, as described in the paper, but there are mysteries:
One thing is certain: the FPGA is working in an analogue manner. Up until the final version, the circuits were producing analogue waveforms, not the neat digital outputs of 0 volts and 5 volts. Thompson says the feedback loops in the final circuit are unlikely to sustain the 0 and 1 logic levels of a digital circuit. “Evolution has been free to explore the full repertoire of behaviours available from the silicon resources,” says Thompson.
That repertoire turns out to be more intriguing than Thompson could have imagined. Although the configuration program specified tasks for all 100 cells, it transpired that only 32 were essential to the circuit’s operation. Thompson could bypass the other cells without affecting it. A further five cells appeared to serve no logical purpose at all: there was no route of connections by which they could influence the output. And yet if he disconnected them, the circuit stopped working.
It appears that evolution made use of some physical property of these cells (possibly a capacitive effect or electromagnetic inductance) to influence a signal passing nearby. Somehow, it seized on this subtle effect and incorporated it into the solution.
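The tone-discrimination task itself is easy to mimic in software, though nothing below reproduces Thompson’s FPGA setup. As a toy illustration of the same blind-watchmaker search, this little genetic algorithm evolves a single threshold on zero-crossing counts to tell a 1 kHz tone from a 10 kHz one; the sample rate, population size, and mutation scale are all arbitrary choices:

```python
import math
import random

random.seed(0)
RATE = 44100  # samples per second (arbitrary choice)

def tone(freq, n=441):
    """n samples (10 ms at 44.1 kHz) of a sine wave at freq Hz."""
    return [math.sin(2 * math.pi * freq * t / RATE) for t in range(n)]

def zero_crossings(signal):
    """Count sign changes between consecutive samples."""
    return sum(1 for a, b in zip(signal, signal[1:]) if a * b < 0)

# Training data: zero-crossing counts, labeled 0 for 1 kHz, 1 for 10 kHz.
data = [(zero_crossings(tone(1000)), 0), (zero_crossings(tone(10000)), 1)]

def fitness(threshold):
    """Fraction of tones classified correctly by this threshold."""
    return sum((count > threshold) == bool(label)
               for count, label in data) / len(data)

# Tiny evolutionary loop: keep the fittest thresholds, mutate them.
population = [random.uniform(0, 200) for _ in range(10)]
for _ in range(30):
    population.sort(key=fitness, reverse=True)
    parents = population[:5]
    population = parents + [p + random.gauss(0, 5) for p in parents]

best = max(population, key=fitness)
print(fitness(best))
```

The contrast with Thompson’s result is the point: this software version has no stray capacitance or inductance for evolution to exploit, so the search space really is just one number, and the solution is fully explainable.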
Similarly, evolved creatures present systems with some surprises. Laasya Samhita writes at The Scientist:
Vitamin A deficiency is associated with several health problems including night blindness and increased asthma risk. And as with other nutritional deficiencies, it is also known to compromise adaptive immunity mediated by the specialized T cells of the immune system. So it came as a surprise when researchers found that vitamin A deficiency could also activate the immune system and help protect mice against worm infections. Their work was published in Science today (January 23).
Yasmine Belkaid from the National Institute of Allergy and Infectious Diseases and her colleagues were examining the effects of vitamin A deficiency on the intestinal populations of two types of mouse innate immune cells—innate lymphoid cells 2 and 3 (ILC2, ILC3), which play major roles in maintaining “barrier immunity,” the first line of defense at surfaces exposed to the environment, such as the intestine and skin—when they found this unexpected result. When the researchers blocked the active metabolite of vitamin A, retinoic acid, ILC3 cell populations diminished as expected. But ILC2 cell populations swelled.
Describing the work as “an elegant series of experiments,” Richard Grencis, a professor of immunology and microbiology at the University of Manchester who was not involved in the work, told The Scientist in an e-mail that “a traditional hallmark of an innate immune response is that it does not change—does not adapt—regardless of how many times it is activated. This work clearly challenges this view, but with a novel slant.”
Tracy Vence writes at The Scientist:
Most adult mammals can’t digest milk, but humans have evolved lactase persistence—the continued activity of the lactose-digesting enzyme lactase—several times, independently and in different parts of the world, during the last 10,000 years. Evolutionary biologists have proposed several theories to explain the strong selection for lactase persistence among dairy-consuming populations, but to date, none have provided definitive reasoning for the evolution of such an unusual digestive ability.
Analyzing ancient DNA from the skeletal remains of eight late Neolithic Iberian people, scientists now present evidence to suggest that one such hypothesis—that lactase persistence was selected for among early northern Europeans to allow people to drink milk to avoid calcium and vitamin D deficiencies—could not alone explain the rapid rise in lactase persistence across the continent. Their work was published in Molecular Biology and Evolution today (January 22).
“It is likely that the advantage provided to lactase-persistent individuals was not constant throughout time and space,” University College London’s Pascale Gerbault, whose own work focuses on the spread of lactase persistence in Europe but was not involved in the work, told The Scientist in an e-mail. The new study “may provide some support for this possibility,” she added.
Uppsala University’s Oddný Sverrisdóttir and her colleagues tested the so-called calcium assimilation hypothesis by analyzing ancient and modern DNA from individuals who lived in the same area of high sun exposure. UVB light aids in the synthesis of vitamin D, which in turn supports the absorption of calcium, so these people were unlikely to have suffered poor vitamin D and calcium status. The researchers found that none of the eight ancient individuals carried the known European lactase persistence-associated mutation, called -13,910*T. Modern sequence samples from the same sunny region, however, revealed that more than 30 percent of the population do carry -13,910*T, suggesting that at some point in the last 7,000 years, the population acquired lactase persistence despite having sufficient vitamin D and calcium.
“When I saw that the ancient Spaniards had zero lactase persistence, yet one-third of the contemporary Spanish population has lactase persistence now, I just started wondering,” said Sverrisdóttir.
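The rough population-genetics arithmetic behind that wondering can be sketched with a deterministic selection model. Everything here is an assumption made for illustration: a 25-year generation time, a simple haploid-style logistic model, and a guessed starting allele frequency of 0.1 percent:

```python
import math

GEN_YEARS = 25          # assumed human generation time
T = 7000 / GEN_YEARS    # ~280 generations since the Neolithic samples

def freq_after(p0, s, generations):
    """Deterministic logistic rise of an allele with a selective
    advantage s per generation (haploid approximation): the odds
    p/(1-p) grow by a factor e^s each generation."""
    odds = p0 / (1 - p0) * math.exp(s * generations)
    return odds / (1 + odds)

# Assume the -13,910*T allele started rare (p0 = 0.001, a guess).
# How strong must selection be to approach ~30% in 280 generations?
for s in (0.01, 0.02, 0.03):
    print(f"s = {s:.2f}: final frequency = {freq_after(0.001, s, T):.2f}")
```

Under these toy assumptions, a per-generation advantage of roughly 2 to 3 percent is enough to carry a rare allele past 30 percent in 7,000 years—strong selection, but of the kind a recurring famine could plausibly supply.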
So, if not to stave off calcium and vitamin D deficiencies, why might this population have acquired lactase persistence? Sverrisdóttir and her colleagues propose that during crop failure-induced famines, dairy may have been the only reliable source of sustainable nutrition. Those who were lactose intolerant may have fallen ill from consuming milk products, or starved. “We’re talking about the difference between life and death,” Sverrisdóttir said.
Alternatively, Gerbault offered, “it may be the case that migrants from neighboring populations where the allele frequency was higher” brought the trait to the Iberian people.
Gerbault noted that, if anything, the current study raises more questions than it seems to answer. “We know there are different alleles associated with lactase persistence, but we don’t quite know what the selective pressures [were] that provided lactase persistence with a selective advantage,” she said.
Still, Sverrisdóttir said her team’s results sufficiently cast serious doubt on the calcium assimilation hypothesis for lactase persistence in Europe. Evolutionary hypotheses “are so fascinating to coin, but it can be hard to say if it happened or didn’t,” Sverrisdóttir agreed.
O. Ó. Sverrisdóttir et al., “Direct estimates of natural selection in Iberia indicate calcium absorption was not the only driver of lactase persistence in Europe,” Molecular Biology and Evolution, doi:10.1093/molbev/msu049, 2014.
Jef Akst has an interesting note at The Scientist:
Helicobacter pylori is a widespread bacterium that colonizes the gut mucosa in nearly half the human population, causing gastric inflammation and, in a small percentage of patients, stomach cancer—the second leading cause of cancer-related deaths worldwide. But the prevalence of H. pylori infection does not correlate with cancer incidence, suggesting other factors are at play. In a study published today (January 13) in PNAS, researchers provide evidence that those other factors include the ancestry of both the host and the pathogen: patients who are infected with H. pylori strains that have a distinct ancestry from their own are more likely to suffer severe disease.
“For the first time, [this study] suggests that we have to take the ancestry of both host and microbe into the equation,” said Emad El-Omar, a gastroenterologist and cancer biologist at the University of Aberdeen, who was not involved in the work. “We can’t just look at one or the other.”
The research is the latest study of two Colombian populations that have served as poster children for the study of gastric cancer. A coastal population of primarily African ancestry has a relatively low incidence of the disease as compared with a population of largely Amerindian descent in the Andes Mountains just 200 kilometers away. For years, pathologist and native Colombian Pelayo Correa, a pioneer of gastric cancer research, has puzzled over this discrepancy.
Curious about the role of coevolution between these populations and their pathogens, Correa, along with molecular biologist Barbara Schneider of Vanderbilt University Medical Center and colleagues, decided to take a closer look at the H. pylori strains afflicting patients in these regions. As Colombian patients came into the local hospitals requiring endoscopies, the researchers would ask for volunteers to donate a tissue sample. The samples were then shipped on dry ice to Schneider’s Vanderbilt lab, where her team cultured and analyzed the bacteria within. In the end, the researchers found that while all H. pylori sampled showed evidence of multiple ancestries, those in the coastal region, with a low incidence of stomach cancer, were dominated by ancestral African makeup, just like their human hosts. Those in the mountain region, on the other hand, where gastric cancer is more common, appeared to be more closely related to H. pylori of southern Europe, unlike the predominately Amerindian human population. The results suggested that a shared evolutionary history of humans and bacteria resulted in a less virulent host-pathogen relationship.
“[It’s] fascinating,” said El-Omar. “If you have African strains affecting African-ancestry hosts, it doesn’t cause too much damage, whereas if you’ve got African-origins strains infecting Amerindians up in the mountains, that’s when you get most precancerous changes. So it looks like if you’ve coevolved with your strains, you get less and less virulence.”
Looking more specifically at the relationship between individual patients and their H. pylori infections was even more telling: “The more Amerindian ancestry you see in people and the more African components in the Helicobacter strain, the more likely the person is to have severe gastric lesions,” said Schneider.
The interaction between host and pathogen ancestry was strong, having an effect “five times higher than the effect of cagA,” or cytotoxin-associated gene A, which “is recognized so far as the most virulent H. pylori factor in the gastric cancer process,” Carlos González, an epidemiologist at the Catalan Institute of Oncology in Barcelona, Spain, who was not involved in the research, said in an e-mail.
Of course, El-Omar added, there are other relevant factors to consider, including the presence of harmless worm infections—which dampen inflammatory responses—as well as iron levels and diet. “This story keeps getting more and more interesting and more and more complex,” he says. “It will require [more] studies to unfold the whole story.”
N. Kodaman et al., “Human and Helicobacter pylori coevolution shapes the risk of gastric disease,” PNAS, doi:10.1073/pnas.1318093111, 2014.
Fascinating article on a fresh take on the theory of evolution.
In the NY Times Mark Bittman provides an example:
That “good” news you may have read last week about the Food and Drug Administration’s curbing antibiotics in animal feed may not be so good after all. In fact, it appears that the F.D.A. has once again refused to do all it could to protect public health.
For those who missed it, the agency requested (and “requested” is the right word) that the pharmaceutical industry make a labeling change that, the F.D.A. says, will reduce the routine use of antibiotics in animal production. I’d happily be proven wrong, but I don’t think it will. Rather, I think we’re looking at an industry-friendly response to the public health emergency of diseases caused by antibiotic-resistant bacteria, resistance that is bred in industrially raised animals.
You may know that around 80 percent of antibiotics in the United States are given (fed, mostly) to animals. Why? Because the terrible conditions in which most of our animals are grown foster illness; give them antibiotics and illness is less likely. There is also a belief that “subtherapeutic” doses of antibiotics help animals grow faster. So most “farmers” who raise animals by the tens or hundreds of thousands find it easier to feed them antibiotics than to raise them in ways that allow antibiotics to be reserved for actual illness. (And yes, there are alternatives, even in industrial settings. Denmark raises as many hogs as Iowa and does it with far fewer antibiotics.)
You may also know that this overuse of antibiotics is leading to increasing bacterial resistance, that we’re breeding an army of supergerms. This isn’t theoretical: . . .
The government is again sticking its head in the sand, just as with climate change, only more so. We know the problem, we know the cause, we know how to fix it. But we do nothing.
The book is Daniel Dennett’s Darwin’s Dangerous Idea. Yeah, yeah. I knew it had won a lot of accolades including National Book Award finalist, but still… somehow, as I was reading it, I couldn’t stop thinking, “Wow! This is really good!” So I’m telling you: read just the first 60 pages and see if you don’t get a strong sense of cyberpunk science fiction. Let me know.
A clear example of how businesses in general simply do not care about your health, and how Congress really doesn’t either. Melinda Henneberger reports in the Washington Post:
The farm and pharmaceutical lobbies have blocked all meaningful efforts to reduce the use of antibiotics in raising livestock in America, a practice that poses major public health risks, a study released Tuesday found. The report says Congress has killed every effort to legislate a ban on feeding farm animals antibiotics that are important in human medicine. Not only that, but regulation of livestock feeding practices has grown weaker under the Obama administration, the study says. “Our worst fears were confirmed,” said Bob Martin, executive director of the Johns Hopkins Center for a Livable Future, which issued the report. The Food and Drug Administration’s own statistics, he said, show that fully 80 percent of the antibiotics sold in this country are fed to food animals.
The study comes five years after a troubling report on the way livestock is produced, written by a Pew Charitable Trusts commission of top scientists and ethicists working through the Johns Hopkins Bloomberg School of Public Health. That landmark study warned that industrial farms that are feeding animals antibiotics for breakfast, lunch and dinner are plumping them up at a terrible cost, making antibiotics ever-less effective in treating human disease as microbes grew more resistant.
FDA guidelines in the pipeline now, Martin said, would require the industry to stop using antibiotics specifically to bulk up cows and other food animals, but would continue to allow their use for “disease-control.” What constitutes disease-control is so loosely defined, however, that there would be “no change” in the use of antibiotics as a result, Martin said.
“In a couple of areas the Obama administration started off with good intentions, but when industry pushed back, even weaker rules were issued,” he said. “We saw undue influence everywhere we turned.”
In a response via e-mail Monday evening, an FDA spokeswoman wrote that . . .
Interesting how they proved him to be the culprit, as reported in The Scientist by Chris Palmer:
Something was amiss in the Spanish coastal city of Valencia. A dozen cases of hepatitis C, a potentially fatal blood-borne viral infection that causes cirrhosis of the liver, had turned up within a short time span in early 1998. As more cases popped up over the ensuing weeks, one fact linked virtually all the cases: the patients had at one time or another been admitted to one of two local hospitals.
Valencian public health department officials set up a committee of local scientists and epidemiologists to get a handle on the outbreak. One tool the health department planned to use to identify the source of the infections was a genetic analysis that was just starting to be employed in court cases related to HIV transmission. The forensic tool, based on the principles of molecular phylogenetics, could help infer the most recent common ancestor of virus strains from any two people based on the estimated rate of accumulated viral mutations.
Because of his experience in molecular biology, Fernando González-Candelas, an evolutionary geneticist at the University of Valencia, was tapped to head the health department’s phylogenetic testing. As the investigation expanded, the number of possible cases of infection soared into the hundreds. “We had no idea when we were contacted that it was going to be such a big and complicated problem that it turned out to be,” says González-Candelas. Ultimately, 275 people—almost all of them patients at one or both of two hospitals in Valencia—were determined to be victims of the outbreak, which stretched back to 1988.
When the Valencian provincial court learned of the health department’s scientific committee, it asked to use the findings of the phylogenetic analysis as evidence for a criminal case against Juan Maeso, an anesthetist who worked regularly at the two hospitals (and occasionally at others) and who had administered painkillers intravenously to all of the known hepatitis C patients following surgical procedures.
González-Candelas and his team spent the next 2 years comparing 4,000 sequences of the hepatitis C virus (HCV) genome from 322 patients who had contracted HCV during Maeso’s tenure to more than 100 genome sequences from 28 HCV haplotypes that Maeso carried.
But virus genomes evolve rapidly—about one million times faster than the human genome. “There is a race between the virus and the immune system, with one trying to control the other and the other trying to escape,” says González-Candelas.
This means that viral sequences from the source and even a recently infected individual are almost never identical, according to Anne-Mieke Vandamme, an epidemiological virologist at Katholieke Universiteit Leuven in Belgium who was not involved in the research. However, the rate at which mutations accumulate is relatively constant, so recently infected individuals should have viruses with higher sequence similarity to the source than those infected in the distant past. . .
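Vandamme’s point—that more recently infected patients sit closer in sequence space to the source—can be illustrated with a simple per-site distance. The sequences below are made up, and real forensic work like González-Candelas’s uses phylogenetic models rather than raw Hamming distance:

```python
def hamming(a, b):
    """Fraction of aligned sites at which two sequences differ."""
    assert len(a) == len(b)
    return sum(x != y for x, y in zip(a, b)) / len(a)

# Toy aligned fragments: the source's virus plus two patients' viruses.
source = "ACGTACGTACGTACGTACGT"
recent = "ACGTACGTACGAACGTACGT"  # 1 difference: infected recently
older  = "ACCTACGAACGAACGTTCGT"  # 4 differences: infected long ago

# A recent infection has had less time to accumulate mutations,
# so it lies closer to the source.
assert hamming(source, recent) < hamming(source, older)
print(hamming(source, recent), hamming(source, older))
```

With a roughly constant mutation rate, these distances can be converted into estimated infection dates—which is how the outbreak could be traced back to 1988 from samples taken a decade later.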
Very interesting article in Quanta by Peter Byrne:
The developmental biologist Cassandra Extavour sings classical and baroque music on stage with the Handel and Haydn Society at Symphony Hall in Boston. Blessed with a beautiful soprano voice, she could easily have chosen to pursue a career as a singer. But a summer working in a developmental genetics laboratory as an undergraduate tipped the scales in favor of science, and Extavour is now an associate professor of organismic and evolutionary biology at Harvard University. By choosing biology, she says, she has been able to pursue her musical career part time; a full-time concert soprano, by contrast, would not have had the time to run a lab on the side.
Extavour directs a national research collaborative called EDEN, which stands for Evo-Devo-Eco (evolutionary-developmental-ecological) Network. The organization, funded by the National Science Foundation, encourages geneticists to dissect more exotic creatures than the ubiquitous fruit fly, Drosophila melanogaster. EDEN researchers model the various evolutionary paths of sea anemones, horseshoe crabs, mosses, crickets, spiders, milkweed bugs and the super-hardy tardigrade. Extavour’s own lab focuses on dissecting insect embryos and ovaries, searching for genetic clues to the origin of multicellularity and the complex organisms that multicellularity made possible, including Homo sapiens. Extavour’s special expertise is in tracking the development of germ cells, the cells created in an embryo that contain the genetic code for reproducing multicellular organisms.
Last winter, Extavour was one of the organizers of a ten-week program on a controversial topic, “Cooperation and the Evolution of Multicellularity,” at the Kavli Institute for Theoretical Physics at the University of California, Santa Barbara. The daily talk sessions were attended by scores of highly regarded scientists from all over the world, including developmental and evolutionary biologists, mathematical physicists, and zoologists — plus an embedded journalist reporting for Quanta Magazine. The event was unusual because of its prolonged and multidisciplinary nature. And many exchanges were heated because, despite 150 years of research on the biology of evolution, scientists still disagree about how and why multicellular creatures and plants emerged from ancient oceans that teemed with robust and self-reliant single-celled entities.
At the conference, biologists who work mostly in the field observing the behaviors of bees, ants, wolves, slime molds and other creatures tended to look for the mechanics of natural selection at the behavioral level by examining how individual organisms self-organize into hives, nests, packs, conglomerates or families. Physicists and molecular biologists focused more on the micromechanics of natural selection at the level of the genome, looking to mathematically measure the “fitness” of what they call competing and cooperating genes or cells.
But the scientists were not always on the same page even regarding the meaning of such key concepts as cooperation, competition and fitness. The physicists and molecular biologists employed statistical mechanics and game theory to help build explanatory mathematical models of DNA, proteins and entire genomes. That preoccupation with micro-quantification sometimes raised the hackles of field-oriented biologists who were more focused on analyzing social behaviors.
Through it all, week after week, the unflappable Extavour kept the conferees focused on the central issue: Precisely what physical mechanisms originally drove single cells to unite for mutual benefit? How can we quantify this benefit in evolutionary terms? A few months after the conference, Quanta Magazine interviewed Extavour about her own point of view. This is a condensed and edited version of that interview, incorporating a portion of her final talk at Kavli.

QUANTA MAGAZINE: Are you evo, devo or eco?
CASSANDRA EXTAVOUR: As a developmental biologist — a devo, if you will — I am intrigued by how cells become eggs and sperm in multicellular creatures. During the development of an animal embryo from a fertilized egg cell, a process that’s called embryogenesis, only a tiny percentage of the millions of genetically identical cells that make up the embryo will become gametes capable of passing their genomes on to successive generations.
Most of an embryo’s cells become soma: cells capable of forming vital organs, muscle, skin and bones. Somatic cells reproduce by dividing — genetically mirroring themselves — but they cannot contribute their specific genomes to the formation of new creatures through sexual reproduction. That is solely the job of the succession of cells in what we call a germ line.
In a sense, the somatic cells sacrifice their genetic “immortality” to protect the germ-line cells. And this primal division of reproductive labor has evolutionary consequences: It allows sexual reproduction and fosters genetic diversity and the evolution of multicellularity.
Now the eco in evo-devo-eco comes into play. The core problem in the study of the development of multicellular organisms is: Why do cells that start out with identical genomes do different things in different environments?
How do you track the development of germ cells in the lab?
We dissect the ovaries and embryos of spiders, crickets and milkweed bugs, using molecular biology and microscopy tools to map the genetic mechanisms that guide the emergence of germ cells. In some organisms, the assignment of a cell to the germ line is caused by an inheritance-based mechanism: Before there is even an embryo, the molecular content of some cells predetermines them to develop as either germ or soma. In other organisms, there is instead a signaling mechanism: An embryonic cell receives chemical signals from neighboring cells that activate (or repress) the genes that allow for germ-line function.
Why bother to be multicellular?
The evolution of a distinct germ line that is protected by the diverse somatic functions of the organism is thought to confer an evolutionary advantage — what we call a fitness benefit — to multicellular organisms, whether plant or animal or slime mold.
How so? . . .